Lower Bounds for the Complexity of the Voronoi Diagram of Polygonal Curves under the Discrete Fréchet Distance
We give lower bounds for the combinatorial complexity of the Voronoi diagram
of polygonal curves under the discrete Fréchet distance. We show that the
Voronoi diagram of n curves in R^d with k vertices each has complexity
Omega(n^{dk}) for dimensions d=1,2 and Omega(n^{d(k-1)+2}) for d>2.
Comment: 6 pages, 2 figures
Locally Correct Fréchet Matchings
The Fréchet distance is a metric to compare two curves, which is based on
monotone matchings between these curves. We call a matching that realizes
the Fréchet distance a Fréchet matching. There are often many different Fréchet
matchings, and not all of these capture the similarity between the curves well.
We propose to restrict the set of Fréchet matchings to "natural" matchings and
to this end introduce locally correct Fréchet matchings. We prove that at least
one such matching exists for any two polygonal curves and give an O(N^3 log N)
algorithm to compute it, where N is the total number of edges in both curves.
We also present an O(N^2) algorithm to compute a locally correct discrete
Fréchet matching.
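
As background for the O(N^2) bound above, the following is a minimal sketch of the standard dynamic program for the discrete Fréchet distance between two point sequences. It computes only the distance, not the locally correct matching constructed in the paper; all function names are illustrative.

```python
import math

def discrete_frechet(P, Q):
    """Standard O(len(P)*len(Q)) dynamic program for the discrete
    Frechet distance between point sequences P and Q (a sketch, not
    the paper's local-correctness algorithm)."""
    n, m = len(P), len(Q)
    # ca[i][j] = discrete Frechet distance of the prefixes P[:i+1], Q[:j+1]
    ca = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            cost = math.dist(P[i], Q[j])
            if i == 0 and j == 0:
                ca[i][j] = cost
            elif i == 0:
                ca[i][j] = max(ca[i][j - 1], cost)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][j], cost)
            else:
                ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1],
                                   ca[i - 1][j - 1]), cost)
    return ca[n - 1][m - 1]

print(discrete_frechet([(0, 0), (1, 0), (2, 0)],
                       [(0, 1), (1, 1), (2, 1)]))  # 1.0
```
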
Four Soviets Walk the Dog: Improved Bounds for Computing the Fréchet Distance
Given two polygonal curves in the plane, there are many ways to define a
notion of similarity between them. One popular measure is the Fréchet
distance. Since it was proposed by Alt and Godau in 1992, many variants and
extensions have been studied. Nonetheless, even more than 20 years later, the
original O(n^2 log n) algorithm by Alt and Godau for computing the Fréchet
distance remains the state of the art (here, n denotes the number of edges on
each curve). This has led Helmut Alt to conjecture that the associated decision
problem is 3SUM-hard.
In recent work, Agarwal et al. show how to break the quadratic barrier for
the discrete version of the Fréchet distance, where one considers sequences
of points instead of polygonal curves. Building on their work, we give a
randomized algorithm to compute the Fréchet distance between two polygonal
curves in time O(n^2 sqrt(log n) (log log n)^{3/2}) on a pointer machine
and in time O(n^2 (log log n)^2) on a word RAM. Furthermore, we show that
there exists an algebraic decision tree for the decision problem of depth
O(n^{2-epsilon}), for some epsilon > 0. We believe that this
reveals an intriguing new aspect of this well-studied problem. Finally, we show
how to obtain the first subquadratic algorithm for computing the weak Fréchet
distance on a word RAM.
Comment: 34 pages, 15 figures. A preliminary version appeared in SODA 2014
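
For context on the decision problem discussed above, here is a sketch of its discrete analogue (the setting of Agarwal et al.): a pair (i, j) is "free" when the two points are within delta of each other, and the distance is at most delta exactly when a monotone path of free pairs connects the start pair to the end pair. The exact distance is then the smallest candidate value for which the decision is "yes". This is only an illustration; the paper's algorithms for the continuous case are substantially more involved.

```python
import math

def decide_discrete_frechet(P, Q, delta):
    """Decide whether the discrete Frechet distance of P and Q is <= delta
    by propagating reachability through the 'free' cells of the n x m grid
    (a sketch of the decision-problem viewpoint, discrete setting only)."""
    n, m = len(P), len(Q)
    free = [[math.dist(P[i], Q[j]) <= delta for j in range(m)] for i in range(n)]
    reach = [[False] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            if not free[i][j]:
                continue
            if i == 0 and j == 0:
                reach[i][j] = True
            else:
                reach[i][j] = ((i > 0 and reach[i - 1][j]) or
                               (j > 0 and reach[i][j - 1]) or
                               (i > 0 and j > 0 and reach[i - 1][j - 1]))
    return reach[n - 1][m - 1]

# Exact value = smallest pairwise distance for which the decision is 'yes'.
P, Q = [(0, 0), (1, 0)], [(0, 1), (1, 2)]
cands = sorted({math.dist(p, q) for p in P for q in Q})
print(next(c for c in cands if decide_discrete_frechet(P, Q, c)))  # 2.0
```
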
Progressive Simplification of Polygonal Curves
Simplifying polygonal curves at different levels of detail is an important
problem with many applications. Existing geometric optimization algorithms are
only capable of minimizing the complexity of a simplified curve for a single
level of detail. We present an O(n^3 m)-time algorithm that takes a polygonal
curve of n vertices and produces a set of consistent simplifications for m
scales while minimizing the cumulative simplification complexity. This
algorithm is compatible with distance measures such as the Hausdorff, the
Fréchet and area-based distances, and enables simplification for continuous
scaling in O(n^5) time. To speed up this algorithm in practice, we present
new techniques for constructing and representing so-called shortcut graphs.
Experimental evaluation of these techniques on trajectory data reveals a
significant improvement from using shortcut graphs for progressive and
non-progressive curve simplification, both in terms of running time and memory
usage.
Comment: 20 pages, 20 figures
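
The shortcut graphs mentioned above can be made concrete with the classic single-scale scheme of Imai and Iri, which this line of work builds on: a shortcut (i, j) is admissible when every skipped vertex lies within a tolerance eps of the segment from vertex i to vertex j, and a breadth-first shortest path in this graph yields a minimum-complexity simplification. A rough sketch under a Hausdorff-style error; names are illustrative, and the progressive multi-scale algorithm itself is not reproduced.

```python
import math
from collections import deque

def seg_dist(p, a, b):
    """Euclidean distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def simplify(curve, eps):
    """Single-scale minimum-complexity simplification via the shortcut
    graph: edge (i, j) is allowed when every skipped vertex lies within
    eps of segment curve[i]curve[j]; BFS gives the fewest shortcuts."""
    n = len(curve)
    ok = lambda i, j: all(seg_dist(curve[k], curve[i], curve[j]) <= eps
                          for k in range(i + 1, j))
    prev, seen = [None] * n, [False] * n
    seen[0] = True
    q = deque([0])
    while q:
        i = q.popleft()
        for j in range(i + 1, n):
            if not seen[j] and ok(i, j):
                seen[j], prev[j] = True, i
                q.append(j)
    path, j = [], n - 1
    while j is not None:
        path.append(j)
        j = prev[j]
    return [curve[i] for i in reversed(path)]

print(simplify([(0, 0), (1, 0.1), (2, 0), (3, 1)], eps=0.2))
```
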
A Spanner for the Day After
We show how to construct a (1+epsilon)-spanner over a set of n
points in R^d that is resilient to a catastrophic failure of nodes.
Specifically, for prescribed parameters theta, epsilon in (0,1), the
computed spanner has O(epsilon^{-O(d)} theta^{-6} n log n (log log n)^6)
edges. Furthermore, for any k, and any deleted set of k points, the
residual graph is a (1+epsilon)-spanner for all the points except for
(1+theta)k of them. No previous constructions, beyond the trivial clique
with O(n^2) edges, were known such that only a tiny additional fraction
(i.e., theta) of the vertices lose their distance-preserving connectivity.
Our construction works by first solving the exact problem in one dimension,
and then showing a surprisingly simple and elegant construction in higher
dimensions that uses the one-dimensional construction in a black-box fashion.
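
To make the spanner property concrete, here is a sketch of the classic greedy construction of a (1+epsilon)-spanner: scan point pairs by increasing distance and add an edge only when the current graph detour exceeds (1+epsilon) times the Euclidean distance. This illustrates what the residual graph must guarantee; it is not the paper's resilient construction.

```python
import math, heapq
from itertools import combinations

def greedy_spanner(points, eps):
    """Classic greedy t-spanner with t = 1 + eps (illustration only)."""
    t, n = 1 + eps, len(points)
    adj = {i: {} for i in range(n)}

    def graph_dist(s, goal):  # Dijkstra on the current spanner edges
        dist, pq = {s: 0.0}, [(0.0, s)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == goal:
                return d
            if d > dist.get(u, math.inf):
                continue
            for v, w in adj[u].items():
                if d + w < dist.get(v, math.inf):
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        return math.inf

    for i, j in sorted(combinations(range(n), 2),
                       key=lambda e: math.dist(points[e[0]], points[e[1]])):
        d = math.dist(points[i], points[j])
        if graph_dist(i, j) > t * d:  # detour too long: add the edge
            adj[i][j] = adj[j][i] = d
    return [(i, j) for i in adj for j in adj[i] if i < j]

print(greedy_spanner([(0, 0), (1, 0), (2, 0), (1, 1)], eps=0.5))
```
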
Approximating the Distribution of the Median and other Robust Estimators on Uncertain Data
Robust estimators, like the median of a point set, are important for data
analysis in the presence of outliers. We study robust estimators for
locationally uncertain points with discrete distributions. That is, each point
in a data set has a discrete probability distribution describing its location.
The probabilistic nature of uncertain data makes it challenging to compute such
estimators, since the true value of the estimator is now described by a
distribution rather than a single point. We show how to construct and estimate
the distribution of the median of a point set. Building the approximate support
of the distribution takes near-linear time, and assigning probability to that
support takes quadratic time. We also develop a general approximation technique
for distributions of robust estimators with respect to ranges with bounded VC
dimension. This includes the geometric median for high dimensions and the
Siegel estimator for linear regression.
Comment: Full version of a paper to appear at SoCG 201
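
What "the distribution of the median" means for locationally uncertain points can be illustrated by plain Monte Carlo sampling: draw one location per point from its discrete distribution, record the median, and repeat. A sketch only, with an assumed input format of (location, probability) pairs; it is not the paper's near-linear support construction.

```python
import random
from collections import Counter

def median_distribution(uncertain_points, trials=10_000, seed=0):
    """Monte Carlo sketch: each uncertain point is a list of
    (location, probability) pairs; tally how often each value
    is the median of a random realization."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        sample = sorted(
            rng.choices([loc for loc, _ in pt],
                        weights=[p for _, p in pt])[0]
            for pt in uncertain_points)
        counts[sample[len(sample) // 2]] += 1  # upper median for even sizes
    return {loc: c / trials for loc, c in sorted(counts.items())}

# Three uncertain points on the line, each with a two-point distribution.
pts = [[(0.0, 0.5), (1.0, 0.5)],
       [(2.0, 0.9), (5.0, 0.1)],
       [(3.0, 0.5), (4.0, 0.5)]]
print(median_distribution(pts))  # ~{2.0: 0.9, 3.0: 0.05, 4.0: 0.05}
```
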
Vectors in a Box
For an integer d>=1, let tau(d) be the smallest integer with the following
property: If v1,v2,...,vt is a sequence of t>=2 vectors in [-1,1]^d with
v1+v2+...+vt in [-1,1]^d, then there is a subset S of {1,2,...,t} of indices,
2<=|S|<=tau(d), such that \sum_{i\in S} vi is in [-1,1]^d. The quantity tau(d)
was introduced by Dash, Fukasawa, and Günlük, who showed that tau(2)=2,
tau(3)=4, and tau(d)=Omega(2^d), and asked whether tau(d) is finite for all d.
Using the Steinitz lemma, in a quantitative version due to Grinberg and
Sevastyanov, we prove an upper bound of tau(d) <= d^{d+o(d)}, and based on a
construction of Alon and Vu, whose main idea goes back to Håstad, we obtain a
lower bound of tau(d)>= d^{d/2-o(d)}.
These results contribute to understanding the master equality polyhedron with
multiple rows defined by Dash et al., which is a "universal" polyhedron
encoding valid cutting planes for integer programs (this line of research was
started by Gomory in the late 1960s). In particular, the upper bound on tau(d)
implies a pseudo-polynomial running time for an algorithm of Dash et al. for
integer programming with a fixed number of constraints. The algorithm consists
in solving a linear program, and it provides an alternative to a 1981 dynamic
programming algorithm of Papadimitriou.
Comment: 12 pages, 1 figure
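
The quantity tau(d) can be explored on tiny instances by brute force: given a sequence satisfying the hypothesis, search for the smallest subset of size at least 2 whose sum stays in the box. The sketch below only illustrates the definition (the search is exponential, so it says nothing about the asymptotic bounds).

```python
from itertools import combinations

def smallest_bounded_subset(vectors):
    """Given v1..vt in [-1,1]^d whose total sum lies in [-1,1]^d, return
    the smallest index set S with 2 <= |S| whose sum is in [-1,1]^d.
    tau(d) is the worst case of this size over all such sequences."""
    d = len(vectors[0])
    in_box = lambda s: all(-1 <= c <= 1 for c in s)
    assert in_box([sum(v[k] for v in vectors) for k in range(d)])
    for size in range(2, len(vectors) + 1):
        for S in combinations(range(len(vectors)), size):
            if in_box([sum(vectors[i][k] for i in S) for k in range(d)]):
                return set(S)

# Four vectors in [-1,1]^2 summing to (0, 0); a pair already suffices.
print(smallest_bounded_subset([(1, 1), (-1, -1), (1, -1), (-1, 1)]))
```
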
Interference Minimization in Asymmetric Sensor Networks
A fundamental problem in wireless sensor networks is to connect a given set
of sensors while minimizing the receiver interference. This is modeled
as follows: each sensor node corresponds to a point in R^d and each
transmission range corresponds to a ball. The receiver interference of a
sensor node is defined as the number of transmission ranges it lies in. Our
goal is to choose transmission radii that minimize the maximum interference
while maintaining a strongly connected asymmetric communication graph.
For the two-dimensional case, we show that it is NP-complete to decide
whether one can achieve a receiver interference of at most 5. In the
one-dimensional case, we prove that there are optimal solutions with nontrivial
structural properties. These properties can be exploited to obtain an exact
algorithm that runs in quasi-polynomial time. This generalizes a result by Tan
et al. to the asymmetric case.
Comment: 15 pages, 5 figures
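
The model is easy to check for a concrete assignment of transmission radii: node u reaches v when v lies in u's ball, a node's receiver interference is the number of balls containing it, and the communication graph must be strongly connected. Below is a small checker in that spirit, with an illustrative one-dimensional example; the exact and quasi-polynomial algorithms of the paper are not reproduced.

```python
import math

def receiver_interference(points, radii):
    """Return (max receiver interference, strongly connected?) for the
    asymmetric model: u reaches v iff dist(u, v) <= radii[u], and v's
    interference is the number of transmission ranges it lies in."""
    n = len(points)
    reach = [[u != v and math.dist(points[u], points[v]) <= radii[u]
              for v in range(n)] for u in range(n)]
    interference = [sum(reach[u][v] for u in range(n)) for v in range(n)]

    def closure(adj, s):  # DFS in the (possibly asymmetric) digraph
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in range(n):
                if adj[u][v] and v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    transpose = [[reach[v][u] for v in range(n)] for u in range(n)]
    strong = (len(closure(reach, 0)) == n and
              len(closure(transpose, 0)) == n)
    return max(interference), strong

pts = [(0,), (1,), (3,)]          # three sensors on a line
print(receiver_interference(pts, radii=[1.0, 2.0, 2.0]))  # (2, True)
```
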